
Anatomical Environments


A Dataset of Anatomical Environments for Medical Robots: Modeling Respiratory Deformation

Fried, Inbar, Hoelscher, Janine, Akulian, Jason A., Alterovitz, Ron

arXiv.org Artificial Intelligence

Anatomical models of a medical robot's environment can significantly help guide design and development of a new robotic system. These models can be used for benchmarking motion planning algorithms, evaluating controllers, optimizing mechanical design choices, simulating procedures, and even as resources for data generation. Currently, the time-consuming task of generating these environments is repeatedly performed by individual research groups and rarely shared broadly. This not only leads to redundant efforts, but also makes it challenging to compare systems and algorithms accurately. In this work, we present a collection of clinically-relevant anatomical environments for medical robots operating in the lungs. Since anatomical deformation is a fundamental challenge for medical robots operating in the lungs, we describe a way to model respiratory deformation in these environments using patient-derived data. We share the environments and deformation data publicly by adding them to the Medical Robotics Anatomical Dataset (Med-RAD), our public dataset of anatomical environments for medical robots.
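One common way to represent respiratory deformation of the kind described above is a per-vertex displacement field interpolated over the breathing cycle. The sketch below illustrates that idea only; the function names, the sinusoidal phase model, and the toy two-vertex mesh are illustrative assumptions, not the actual format of the Med-RAD dataset or its patient-derived deformation data.

```python
import numpy as np

def breathing_phase(t, period=4.0):
    """Map time (seconds) to a smooth respiratory phase in [0, 1],
    assuming a sinusoidal breathing cycle with the given period.
    phase = 0 at full exhale, 1 at full inhale."""
    return 0.5 * (1.0 - np.cos(2.0 * np.pi * t / period))

def deform_mesh(vertices, displacement_inhale, phase):
    """Interpolate vertex positions between the exhale state (phase = 0)
    and full inhale (phase = 1) using a per-vertex displacement field.

    vertices: (N, 3) array of exhale-state mesh vertex positions.
    displacement_inhale: (N, 3) displacement of each vertex at full inhale.
    phase: scalar in [0, 1]."""
    return vertices + phase * displacement_inhale

# Toy example: two vertices of a hypothetical airway mesh, with larger
# motion assigned to the vertex nearer the diaphragm.
verts = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
disp = np.array([[0.0, 0.0, 2.0], [0.0, 0.0, 5.0]])
mid_cycle = deform_mesh(verts, disp, breathing_phase(1.0))  # quarter period in
```

In practice a patient-derived field of this kind is typically estimated by deformable registration between CT scans at different respiratory phases; the linear interpolation here is the simplest possible stand-in for that richer data.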


Efficient and Accurate Mapping of Subsurface Anatomy via Online Trajectory Optimization for Robot Assisted Surgery

Cho, Brian Y., Kuntz, Alan

arXiv.org Artificial Intelligence

Robotic surgical subtask automation has the potential to reduce the per-patient workload of human surgeons. A variety of surgical subtasks require geometric information about subsurface anatomy, such as the location of tumors, which necessitates accurate and efficient surgical sensing. In this work, we propose an automated sensing method that maps 3D subsurface anatomy to provide such geometric knowledge. We model the anatomy via a Bayesian Hilbert map-based probabilistic 3D occupancy map. Using the 3D occupancy map, we plan sensing paths on the surface of the anatomy via a graph search algorithm, $A^*$ search, with a cost function that enables the generated trajectories to balance exploration of unsensed regions with refinement of the existing probabilistic understanding. We demonstrate the performance of our proposed method by comparing it against three other methods in several anatomical environments, including a real-life CT scan dataset. The experimental results show that our method efficiently detects relevant subsurface anatomy with shorter trajectories than the comparison methods, and the resulting occupancy map achieves high accuracy.
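The cost-function idea in this abstract (trajectories that trade off exploring unsensed regions against refining uncertain ones) can be sketched with a plain A* search over a small 2D grid of occupancy probabilities. Everything below is an illustrative assumption: the entropy-based cell cost, the grid layout, and the parameter names are not the paper's Bayesian Hilbert map or its 3D surface graph, just a minimal stand-in for the same trade-off.

```python
import heapq
import math

def sensing_cost(p_occupied):
    """Per-cell traversal cost that is LOW where sensing is informative.
    Entropy of the occupancy probability peaks at p = 0.5 (most uncertain),
    so we use 1 - entropy as the cost; unsensed cells (p is None) get the
    lowest cost of all, encouraging exploration of new regions."""
    if p_occupied is None:          # never sensed -> most attractive
        return 0.0
    if p_occupied in (0.0, 1.0):    # fully certain -> least attractive
        return 1.0
    h = -(p_occupied * math.log2(p_occupied)
          + (1.0 - p_occupied) * math.log2(1.0 - p_occupied))
    return 1.0 - h

def a_star(grid_probs, start, goal):
    """A* over a 2D grid; each edge costs a unit step length plus the
    sensing cost of the destination cell. Manhattan distance is admissible
    here because every edge costs at least the unit step length."""
    rows, cols = len(grid_probs), len(grid_probs[0])
    def h(n):
        return abs(n[0] - goal[0]) + abs(n[1] - goal[1])
    frontier = [(h(start), 0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dr, node[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols:
                ng = g + 1.0 + sensing_cost(grid_probs[nxt[0]][nxt[1]])
                if ng < best.get(nxt, float("inf")):
                    best[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None

# Toy 3x3 map: None = unsensed, 0.5 = maximally uncertain, values near
# 0 or 1 = already well understood. The cheapest path prefers uncertain
# and unsensed cells over confident ones.
grid = [[0.5, None, 0.1],
        [0.9, 0.5,  None],
        [None, 0.2, 0.5]]
path = a_star(grid, (0, 0), (2, 2))
```

On this toy grid the search routes through the uncertain and unsensed cells rather than the high-confidence ones, which is the qualitative behavior the abstract's cost function is designed to produce.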


Interactive-Rate Supervisory Control for Arbitrarily-Routed Multi-Tendon Robots via Motion Planning

Bentley, Michael, Rucker, Caleb, Kuntz, Alan

arXiv.org Artificial Intelligence

Tendon-driven robots, where one or more tendons under tension bend and manipulate a flexible backbone, can improve minimally invasive surgeries involving difficult-to-reach regions in the human body. Planning motions safely within constrained anatomical environments requires accuracy and efficiency in shape estimation and collision checking. Tendon robots that employ arbitrarily-routed tendons can achieve complex and interesting shapes, enabling them to travel to difficult-to-reach anatomical regions. Arbitrarily-routed tendon-driven robots have unintuitive nonlinear kinematics. Therefore, we envision clinicians leveraging an assistive interactive-rate motion planner to automatically generate collision-free trajectories to clinician-specified destinations during minimally invasive surgical procedures. Standard motion-planning techniques cannot achieve interactive rates with current, computationally expensive tendon robot kinematic models. In this work, we present a 3-phase motion-planning system for arbitrarily-routed tendon-driven robots with a Precompute phase, a Load phase, and a Supervisory Control phase. Our system achieves an interactive rate by developing a fast kinematic model (over 1,000 times faster than current models), a fast voxel collision method (27.6 times faster than standard methods), and leveraging a precomputed roadmap of the entire robot workspace with pre-voxelized vertices and edges. In simulated experiments, we show that our motion-planning method achieves high tip-position accuracy and generates plans at 14.8 Hz on average in a segmented collapsed-lung pleural-space anatomical environment. Our results show that our method is 17,700 times faster than popular off-the-shelf motion planning algorithms with standard FK and collision detection approaches. Our open-source code is available online.
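The precompute-then-query pattern this abstract describes (a roadmap with pre-voxelized vertices, so that online collision checking reduces to cheap set intersections) can be sketched as follows. The data structures and function names are hypothetical illustrations, not the authors' implementation, and the Load phase is reduced to simply holding the dictionaries in memory.

```python
import heapq

def collision_free(vertex_voxels, obstacle_voxels):
    """A precomputed roadmap vertex is valid iff its cached voxel set
    does not intersect the environment's obstacle voxels; set.isdisjoint
    makes this a fast online test, since voxelization was done offline."""
    return vertex_voxels.isdisjoint(obstacle_voxels)

def query_roadmap(vertices, edges, obstacle_voxels, start, goal):
    """Dijkstra over the precomputed roadmap (the online, supervisory-
    control-style query), skipping vertices whose cached voxel sets
    collide with the current obstacle voxels.
    vertices: {id: set_of_voxel_coords}
    edges: {id: [(neighbor_id, cost), ...]}"""
    if not collision_free(vertices[start], obstacle_voxels):
        return None
    dist = {start: 0.0}
    pq = [(0.0, start, [start])]
    while pq:
        d, node, path = heapq.heappop(pq)
        if node == goal:
            return path
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, cost in edges.get(node, []):
            if not collision_free(vertices[nbr], obstacle_voxels):
                continue
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(pq, (nd, nbr, path + [nbr]))
    return None

# Toy roadmap (precomputed offline): 4 vertices, each with a one-voxel
# footprint, and directed edges with costs.
vertices = {0: {(0, 0)}, 1: {(1, 0)}, 2: {(2, 0)}, 3: {(1, 1)}}
edges = {0: [(1, 1.0), (3, 1.5)], 1: [(2, 1.0)], 3: [(2, 1.5)]}
obstacles = {(1, 0)}  # vertex 1 currently collides -> detour via vertex 3
path = query_roadmap(vertices, edges, obstacles, 0, 2)
```

The key property the sketch preserves is that no forward kinematics or geometric collision checking happens at query time; only set lookups and graph search run online, which is what makes interactive-rate replanning plausible.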